Dual Iterative Hard Thresholding: From Non-convex Sparse Minimization to Non-smooth Concave Maximization

Authors

  • Bo Liu
  • Xiao-Tong Yuan
  • Lezi Wang
  • Qingshan Liu
  • Dimitris N. Metaxas
Abstract

Iterative Hard Thresholding (IHT) is a class of projected gradient descent methods for optimizing sparsity-constrained minimization models, with the best known efficiency and scalability in practice. As far as we know, the existing IHT-style methods are designed for sparse minimization in primal form. It remains open to explore duality theory and algorithms in such a non-convex and NP-hard problem setting. In this paper, we bridge this gap by establishing a duality theory for sparsity-constrained minimization with ℓ2-regularized loss function and proposing an IHT-style algorithm for dual maximization. Our sparse duality theory provides a set of sufficient and necessary conditions under which the original NP-hard/non-convex problem can be equivalently solved in a dual formulation. The proposed dual IHT algorithm is a supergradient method for maximizing the non-smooth dual objective. An interesting finding is that the sparse recovery performance of dual IHT is invariant to the Restricted Isometry Property (RIP), which is required by virtually all the existing primal IHT algorithms without sparsity relaxation. Moreover, a stochastic variant of dual IHT is proposed for large-scale stochastic optimization. Numerical results demonstrate the superiority of dual IHT algorithms to the state-of-the-art primal IHT-style algorithms in model estimation accuracy and computational efficiency.
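For concreteness, the primal IHT iteration that the abstract contrasts with is a gradient step followed by hard thresholding (projection onto the set of k-sparse vectors). The sketch below, in Python/NumPy, shows this standard primal form for sparsity-constrained least squares; it is not the dual algorithm proposed in the paper, and the step size and iteration count are illustrative assumptions.

    import numpy as np

    def hard_threshold(x, k):
        """Keep the k largest-magnitude entries of x and zero out the rest."""
        z = np.zeros_like(x)
        idx = np.argsort(np.abs(x))[-k:]
        z[idx] = x[idx]
        return z

    def iht(A, b, k, iters=200):
        """Standard primal IHT for min_x 0.5*||Ax - b||^2  s.t.  ||x||_0 <= k."""
        eta = 1.0 / np.linalg.norm(A, 2) ** 2        # conservative step size: 1 / ||A||_2^2
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            grad = A.T @ (A @ x - b)                 # gradient of the least-squares loss
            x = hard_threshold(x - eta * grad, k)    # projected gradient step
        return x

The hard-thresholding step is the Euclidean projection onto the non-convex set of k-sparse vectors; analyses of this primal scheme typically rely on RIP-type conditions, which is the contrast the abstract draws.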


Similar articles

Smooth Optimization Approach for Sparse Covariance Selection

In this paper we first study a smooth optimization approach for solving a class of non-smooth strictly concave maximization problems whose objective functions admit smooth convex minimization reformulations. In particular, we apply Nesterov’s smooth optimization technique [19, 21] to their dual counterparts that are smooth convex problems. It is shown that the resulting approach has O(1/√ε) ite...
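As a generic point of reference, Nesterov's accelerated scheme for a smooth convex objective with an L-Lipschitz gradient can be sketched in a few lines; this is the textbook version on an arbitrary smooth problem, not the covariance-selection formulation of that paper, and the quadratic example in the comments is an illustrative assumption.

    import numpy as np

    def nesterov_agd(grad, L, x0, iters=500):
        """Nesterov's accelerated gradient method: O(1/t^2) suboptimality decay
        for convex objectives with L-Lipschitz-continuous gradients."""
        x, y, t = x0.copy(), x0.copy(), 1.0
        for _ in range(iters):
            x_next = y - grad(y) / L                        # gradient step at the extrapolated point
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_next + (t - 1.0) / t_next * (x_next - x)  # momentum / extrapolation step
            x, t = x_next, t_next
        return x

    # Example (illustrative): minimize the smooth convex function 0.5*||Ax - b||^2
    # A = np.random.randn(20, 10); b = np.random.randn(20)
    # x_hat = nesterov_agd(lambda x: A.T @ (A @ x - b), np.linalg.norm(A, 2) ** 2,
    #                      np.zeros(10))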


Minimization of Non-smooth, Non-convex Functionals by Iterative Thresholding

Numerical algorithms for a special class of non-smooth and non-convex minimization problems in infinite dimensional Hilbert spaces are considered. The functionals under consideration are the sum of a smooth and non-smooth functional, both possibly non-convex. We propose a generalization of the gradient projection method and analyze its convergence properties. For separable constraints in the se...
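In the convex special case where the smooth term is a least-squares loss and the non-smooth term is an ℓ1 penalty, a gradient step followed by a thresholding step is the classical ISTA iteration. The minimal sketch below covers only that convex instance, not the non-convex Hilbert-space setting analyzed in that paper; the regularization weight and iteration count are illustrative assumptions.

    import numpy as np

    def soft_threshold(x, tau):
        """Proximal operator of tau*||.||_1 (componentwise soft thresholding)."""
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def ista(A, b, lam, iters=500):
        """ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
        eta = 1.0 / np.linalg.norm(A, 2) ** 2   # step size from the Lipschitz constant of the smooth part
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            x = soft_threshold(x - eta * A.T @ (A @ x - b), eta * lam)  # gradient step, then threshold
        return x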


Conjugate gradient acceleration of iteratively re-weighted least squares methods

Iteratively Re-weighted Least Squares (IRLS) is a method for solving minimization problems involving non-quadratic cost functions, perhaps non-convex and non-smooth, which however can be described as the infimum over a family of quadratic functions. This transformation suggests an algorithmic scheme that solves a sequence of quadratic problems to be tackled efficiently by tools of numerical lin...
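A minimal sketch of the IRLS idea on a least-absolute-deviations fit illustrates the quadratic-subproblem structure described above: each outer step minimizes a weighted least-squares surrogate. The conjugate-gradient acceleration contributed by that paper is omitted, and the smoothing constant eps is an illustrative assumption.

    import numpy as np

    def irls_lad(A, b, iters=50, eps=1e-8):
        """IRLS for min_x ||Ax - b||_1: each iteration solves the quadratic problem
        min_x sum_i w_i * (a_i^T x - b_i)^2 with weights w_i = 1 / |r_i|."""
        x = np.linalg.lstsq(A, b, rcond=None)[0]     # initialize with ordinary least squares
        for _ in range(iters):
            r = A @ x - b
            w = 1.0 / np.maximum(np.abs(r), eps)     # reweighting, smoothed away from zero residuals
            AtW = A.T * w                            # equivalent to A.T @ diag(w)
            x = np.linalg.solve(AtW @ A, AtW @ b)    # weighted least-squares subproblem
        return x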


Iterative $\ell_1$ minimization for non-convex compressed sensing

An algorithmic framework, based on the difference of convex functions algorithm, is proposed for minimizing a class of concave sparse metrics for compressed sensing problems. The resulting algorithm iterates a sequence of ℓ1 minimization problems. An exact sparse recovery theory is established to show that the proposed framework always improves on the basis pursuit (ℓ1 minimization) and inherit...


On the Convergence of the Concave-Convex Procedure

The concave-convex procedure (CCCP) is a majorization-minimization algorithm that solves d.c. (difference of convex functions) programs as a sequence of convex programs. In machine learning, CCCP is extensively used in many learning algorithms like sparse support vector machines (SVMs), transductive SVMs, sparse principal component analysis, etc. Though widely used in many applications, the con...
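To make the majorization-minimization structure concrete, here is a minimal CCCP iteration on the one-dimensional toy d.c. objective f(x) = x^4 - 2x^2, with u(x) = x^4 and v(x) = 2x^2 both convex; the toy objective is an assumption chosen only so that each convex subproblem has a closed-form minimizer.

    import numpy as np

    def cccp_double_well(x0, iters=30):
        """CCCP for f(x) = x**4 - 2*x**2 written as u(x) - v(x) with
        u(x) = x**4 and v(x) = 2*x**2. Each step linearizes v at the current
        iterate and minimizes the convex surrogate u(x) - v'(x_k)*x."""
        x = float(x0)
        for _ in range(iters):
            slope = 4.0 * x            # v'(x_k) = 4*x_k
            x = np.cbrt(slope / 4.0)   # argmin_x x**4 - slope*x, solved in closed form
        return x

    # Starting from any nonzero x0 the iterates decrease f monotonically and
    # converge to one of the local minima at x = +1 or x = -1; x0 = 0 stays
    # at the stationary point 0.
    # cccp_double_well(0.3)  ->  approximately 1.0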




Publication date: 2017